UK to ban deepfake AI 'nudification' apps
The UK government says it will ban so-called nudification apps as part of efforts to tackle misogyny online. New laws, announced on Thursday as part of a wider strategy to halve violence against women and girls, will make it illegal to create and supply AI tools letting users edit images to seemingly remove someone's clothing. The new offences would build on existing rules around sexually explicit deepfakes and intimate image abuse, the government said. "Women and girls deserve to be safe online as well as offline," said Technology Secretary Liz Kendall. "We will not stand by while technology is weaponised to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes."
- North America > United States (0.16)
- North America > Central America (0.15)
- Oceania > Australia (0.06)
- (15 more...)
- Law (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.69)
- Information Technology > Security & Privacy (0.59)
- (2 more...)
Leading deepfake porn site is shut down for good
In a development that can only be seen as positive, the non-consensual deepfake porn site Mr. Deepfakes has shut down for good, reports 404 Media. The shutdown follows the site losing one of its service providers. "A critical service provider has terminated service permanently. Data loss has made it impossible to continue operation," a notice on the site reads. "We will not be relaunching. Any website claiming this is fake. This domain will eventually expire and we are not responsible for future use. This message will be removed around one week."
- North America > United States (0.21)
- North America > Canada > Ontario > Toronto (0.08)
- Europe > United Kingdom (0.08)
UK regulator wants to ban apps that can make deepfake nude images of children
The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. It states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And though creating or uploading CSAM images is illegal, apps used to create deepfake nude images are still legal. "Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone -- a stranger, a classmate, or even a friend -- could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," said Children's Commissioner Dame Rachel de Souza.
- Europe > United Kingdom (0.52)
- North America > United States (0.06)
- North America > Canada (0.06)
- Information Technology > Security & Privacy (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.40)
- Government > Military > Cyberwarfare (0.40)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.36)
New UK law would criminalize creating sexually explicit deepfakes
Bad actors have created deepfakes to imitate celebrity endorsements, President Biden and employers. But one of the most heinous uses is making sexually explicit deepfakes of real people. Now, the UK government is taking new steps to deter their creation, introducing new criminal offenses for producing or sharing sexually explicit deepfakes. Under current UK law, only sharing such deepfakes is an offense; creating them is not. "With these new measures, we're sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal," said Baroness Margaret Beryl Jones, minister for the future digital economy and online safety.
High School Is Becoming a Cesspool of Sexually Explicit Deepfakes
For years now, generative AI has been used to conjure all sorts of realities--dazzling paintings and startling animations of worlds and people, both real and imagined. This power has brought with it a tremendous dark side that many experts are only now beginning to contend with: AI is being used to create nonconsensual, sexually explicit images and videos of children. And not just in a handful of cases--perhaps millions of kids nationwide have been affected in some way by the emergence of this technology, either directly victimized themselves or made aware of other students who have been. This morning, the Center for Democracy and Technology, a nonprofit that advocates for digital rights and privacy, released a report on the alarming prevalence of nonconsensual intimate imagery (or NCII) in American schools. In the past school year, the center's polling found, 15 percent of high schoolers reported hearing about a "deepfake"--or AI-generated image--that depicted someone associated with their school in a sexually explicit or intimate manner.
Meta needs updated rules for sexually explicit deepfakes, Oversight Board says
Meta's Oversight Board is urging the company to update its rules around sexually explicit deepfakes. The board made the recommendations as part of its decision in two cases stemming from user appeals over AI-generated images of public figures, though it declined to name the individuals involved. One post, which originated on Instagram, depicted a nude Indian woman. The post was reported to Meta, but the report was automatically closed after 48 hours, as was a subsequent user appeal.